Asynchronous Stochastic Variational Inference

Authors

  • Saad Mohamad
  • Abdelhamid Bouchachia
  • Moamar Sayed Mouchaweh
Abstract

Stochastic variational inference (SVI) employs stochastic optimization to scale up Bayesian computation to massive data. Since SVI is at its core a stochastic gradient-based algorithm, horizontal parallelism can be harnessed to allow larger-scale inference. We propose a lock-free parallel implementation of SVI which allows distributed computation over multiple slaves in an asynchronous style. We show that our implementation leads to linear speed-up while guaranteeing an asymptotic ergodic convergence rate of O(1/√T), given that the number of slaves is bounded by √T (where T is the total number of iterations). The implementation is done in a high-performance computing (HPC) environment using the message passing interface (MPI) for Python (MPI4py). An extensive empirical evaluation shows that our parallel SVI is lossless, performing comparably to its serial counterpart with linear speed-up.
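The asynchronous, lock-free scheme summarized in the abstract can be illustrated with a minimal sketch (this is not the authors' MPI4py implementation): several worker threads repeatedly draw a sample, form an intermediate estimate, and apply the SVI stochastic update λ ← (1 − ρ_t)·λ + ρ_t·λ̂ to a shared parameter without any lock, Hogwild-style. The toy model (estimating the mean of Gaussian data), the step-size constants `tau` and `kappa`, and all names here are illustrative assumptions.

```python
import threading
import random

random.seed(0)
data = [random.gauss(5.0, 1.0) for _ in range(10000)]

# Shared global variational parameter (one natural parameter, kept in a
# one-element list so worker threads can update it in place, lock-free).
lam = [0.0]
step = [0]


def worker(n_iters, tau=1.0, kappa=0.6):
    for _ in range(n_iters):
        x = random.choice(data)        # sample a "minibatch" of size 1
        lam_hat = x                    # intermediate estimate from the sample
        t = step[0] = step[0] + 1      # racy counter: acceptable in a sketch
        rho = (t + tau) ** (-kappa)    # Robbins-Monro step-size schedule
        # Asynchronous SVI update applied without any lock (Hogwild-style):
        lam[0] = (1 - rho) * lam[0] + rho * lam_hat


threads = [threading.Thread(target=worker, args=(5000,)) for _ in range(4)]
for th in threads:
    th.start()
for th in threads:
    th.join()

# lam[0] is now close to the data mean (≈ 5.0), despite the unlocked,
# interleaved writes from four workers reading stale values of lam.
print(lam[0])
```

Reads of `lam[0]` may be stale and writes may interleave, which is exactly the bounded-delay setting the paper's O(1/√T) ergodic analysis covers; the decaying step size ρ_t is what keeps the unlocked updates stable.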


Related resources

Asynchronous Distributed Variational Gaussian Process for Regression

Gaussian processes (GPs) are powerful nonparametric function estimators. However, their applications are largely limited by the expensive computational cost of the inference procedures. Existing stochastic or distributed synchronous variational inference methods, although they have alleviated this issue by scaling up GPs to millions of samples, are still far from satisfactory for real-world large applicati...


Adaptively Setting the Learning Rate in Stochastic Variational Inference

Stochastic variational inference is a promising method for fitting large-scale probabilistic models with hidden structures. Different from traditional stochastic learning, stochastic variational inference uses the natural gradient, which is particularly efficient for computing probabilistic distributions. One of the issues in stochastic variational inference is to set an appropriate learning ra...


Scalable Inference Algorithms for Clustering Large Networks

Clustering is an important task in network analysis, with applications in fields such as biology and the social sciences. We present a novel inference algorithm for the Stochastic Block Model (SBM), a well-known network clustering model. Previous inference in this model typically utilizes Markov Chain Monte Carlo or Variational Bayes, but our method is the first to utilize Stochastic Variationa...


Variational Inference on Deep Exponential Family by using Variational Inferences on Conjugate Models

In this paper, we propose a new variational inference method for deep exponential-family (DEF) models. Our method converts non-conjugate factors in a DEF model to easy-to-compute conjugate exponential-family messages. This enables local and modular updates similar to variational message passing, as well as stochastic natural-gradient updates similar to stochastic variational inference. Such upda...


Truncation-free Stochastic Variational Inference for Bayesian Nonparametric Models

We present a truncation-free stochastic variational inference algorithm for Bayesian nonparametric models. While traditional variational inference algorithms require truncations for the model or the variational distribution, our method adapts model complexity on the fly. We studied our method with Dirichlet process mixture models and hierarchical Dirichlet process topic models on two large data...



Journal:
  • CoRR

Volume: abs/1801.04289  Issue: -

Pages: -

Publication date: 2018